Discussion of “Least Angle Regression” by Efron
Abstract
I would like to begin by congratulating the authors (referred to below as EHJT) on their interesting paper, in which they propose a new variable selection method (LARS) for building linear models and show how it relates to other methods that have been proposed recently. I found the paper very stimulating, and the additional insight it provides into the Lasso technique is of particular interest.

My comments center on the question of how we can select linear models that conform to the marginality principle [Nelder (1977, 1994); McCullagh and Nelder (1989)]; that is, the response surface is invariant under scaling and translation of the explanatory variables in the model. Recently, one of my interests has been to explore whether the Lasso technique or the nonnegative garrote [Breiman (1995)] could be modified to incorporate the marginality principle. However, it does not seem to be a trivial matter to change the criteria that these techniques minimize in such a way that the marginality principle is incorporated in a satisfactory manner. On the other hand, it seems straightforward to modify the LARS technique to incorporate this principle. In their paper, EHJT address this issue somewhat in passing when they suggest, toward the end of Section 3, controlling the order in which variables are allowed to enter the model by fitting main effects first and adding interactions only in a second step. However, such a two-step procedure may behave somewhat less than optimally, as the following, admittedly artificial, example shows. Assume we have a vector of explanatory variables $X = (X_1, X_2, \ldots, X_{10})$ whose components are independent of one another, with each $X_i$, $i = 1, \ldots, 10$, following a uniform distribution on $[0, 1]$. Take as model ...
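To make the two-step procedure concrete, here is a minimal sketch of one plausible reading of it, written with scikit-learn's Lars estimator (an assumption; the paper itself specifies no software). The response below is a purely hypothetical placeholder, since the model used in the example is not reproduced in this abstract, and the term counts passed to n_nonzero_coefs are arbitrary illustration choices.

```python
import numpy as np
from itertools import combinations
from sklearn.linear_model import Lars

rng = np.random.default_rng(0)

# Ten independent Uniform[0, 1] covariates, as in the example setup above.
n, p = 200, 10
X = rng.uniform(0.0, 1.0, size=(n, p))

# Hypothetical placeholder response: one main effect, one interaction, noise.
y = X[:, 0] + 2.0 * X[:, 1] * X[:, 2] + 0.1 * rng.standard_normal(n)

# Step 1: run LARS on the main effects only and note which ones it selects.
step1 = Lars(n_nonzero_coefs=3).fit(X, y)
selected = np.flatnonzero(step1.coef_)

# Step 2: add pairwise interactions among the selected main effects and rerun
# LARS, so an interaction can only enter after its parent terms have entered.
pairs = list(combinations(selected, 2))
inter = (np.column_stack([X[:, i] * X[:, j] for i, j in pairs])
         if pairs else np.empty((n, 0)))
step2 = Lars(n_nonzero_coefs=5).fit(np.hstack([X, inter]), y)

print("step 1 selects:", selected)
print("step 2 selects:", np.flatnonzero(step2.coef_))
```

The weakness the example is driving at is visible in this mechanics: an interaction whose parent main effects are not picked up in step 1 can never be offered to the second-stage fit.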
Similar articles
Discussion of “Least Angle Regression” by Efron
Algorithms for simultaneous shrinkage and selection in regression and classification provide attractive solutions to knotty old statistical challenges. Nevertheless, as far as we can tell, Tibshirani’s Lasso algorithm has had little impact on statistical practice. Two particular reasons for this may be the relative inefficiency of the original Lasso algorithm and the relative complexity of more...
Discussion of “Least Angle Regression” by Efron
Most of this article concerns the uses of LARS and the two related methods in the age-old, “somewhat notorious,” problem of “[a]utomatic model-building algorithms ...” for linear regression. In the following, I will confine my comments to this notorious problem and to the use of LARS and its relatives to solve it. 1. The implicit assumption. Suppose the response is y, and we collect the m p...
Discussion of “Least Angle Regression” by Efron
I have enjoyed reading the work of each of these authors over the years, so it is a real pleasure to have this opportunity to contribute to the discussion of this collaboration. The geometry of LARS furnishes an elegant bridge between the Lasso and Stagewise regression, methods that I would not have suspected to be so related. Toward my own interests, LARS offers a rather different way to const...
Forward stagewise regression and the monotone lasso
Abstract: We consider the least angle regression and forward stagewise algorithms for solving penalized least squares regression problems. In Efron, Hastie, Johnstone & Tibshirani (2004) it is proved that the least angle regression algorithm, with a small modification, solves the lasso regression problem. Here we give an analogous result for incremental forward stagewise regression, showing tha...
An ordinary differential equation based solution path algorithm.
Efron, Hastie, Johnstone and Tibshirani (2004) proposed Least Angle Regression (LAR), a solution path algorithm for least squares regression. They pointed out that a slight modification of LAR gives the LASSO (Tibshirani, 1996) solution path. However, it is largely unknown how to extend this solution path algorithm to models beyond least squares regression. In this work, we propose a...
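The last two entries above turn on the fact that a small modification of the LAR path yields the Lasso solution path. A minimal sketch of that comparison, assuming scikit-learn (whose lars_path routine implements both variants) and its bundled diabetes data:

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import lars_path

# Trace both solution paths on the same data and compare their breakpoints.
X, y = load_diabetes(return_X_y=True)

alphas_lar, active_lar, coefs_lar = lars_path(X, y, method="lar")
alphas_lasso, active_lasso, coefs_lasso = lars_path(X, y, method="lasso")

# The lasso-modified path gains extra breakpoints wherever a coefficient
# hits zero and its variable is dropped; plain LAR never drops variables.
print("LAR breakpoints:  ", len(alphas_lar))
print("Lasso breakpoints:", len(alphas_lasso))
```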
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
Journal title:
Volume, Issue:
Pages: -
Publication date: 2004